neuromorphic technology
Human Brain Project, Intel Work Together to Advance Neuromorphic Technology
To achieve this, the researchers link two types of deep learning networks. Recurrent neural network modules provide the "short-term memory": they filter possibly relevant information out of the input signal and store it. A feed-forward network then determines which of the relationships found are important for solving the current task; meaningless relationships are filtered out, and neurons fire only in those modules where relevant information has been found. This selective activity is what produces the dramatic energy savings.
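The architecture can be illustrated with a minimal NumPy sketch: a recurrent module updates a memory state, and a feed-forward "relevance" network gates that state so that only units scored as relevant stay active. All dimensions, weights, and the threshold below are illustrative assumptions, not values from the researchers' work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes -- not taken from the actual networks.
n_in, n_hidden = 8, 16

# Recurrent module: keeps a "short-term memory" of the input stream.
W_in = rng.normal(scale=0.3, size=(n_hidden, n_in))
W_rec = rng.normal(scale=0.3, size=(n_hidden, n_hidden))

# Feed-forward "relevance" network: scores each hidden unit.
W_gate = rng.normal(scale=0.3, size=(n_hidden, n_hidden))

def step(h, x, threshold=0.5):
    """One time step: update the memory, then gate out irrelevant units."""
    h_new = np.tanh(W_in @ x + W_rec @ h)             # store candidate information
    relevance = 1 / (1 + np.exp(-(W_gate @ h_new)))   # feed-forward relevance score
    mask = relevance > threshold                      # units "fire" only where relevant
    return h_new * mask, mask

h = np.zeros(n_hidden)
active = []
for _ in range(20):
    x = rng.normal(size=n_in)
    h, mask = step(h, x)
    active.append(mask.mean())

# Only a fraction of units is active each step: fewer operations, less energy.
print(f"average fraction of active units: {np.mean(active):.2f}")
```

The energy argument is the key point: because downstream computation is skipped for masked units, the cost per step scales with the number of active units rather than the full network size.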
Intel Debuts Pohoiki Beach, Its 8M Neuron Neuromorphic Development System
Neuromorphic computing has received less fanfare of late than quantum computing, whose mystery has captured public attention and which seems to have attracted more academic, government, and commercial efforts, even though its payoff also seems more distant. Intel's introduction this week of Pohoiki Beach, an 8-million-neuron neuromorphic system built from 64 Loihi research chips, brings some needed attention back to neuromorphic technology. The new system will be available to Intel's roughly 60 neuromorphic ecosystem partners and represents a significant scaling up of its development platform, with more to come: Intel reportedly plans to introduce a 768-chip, 100-million-neuron system (Pohoiki Springs) near the end of 2019. "Researchers can now efficiently scale up novel neural-inspired algorithms – such as sparse coding, simultaneous localization and mapping (SLAM), and path planning – that can learn and adapt based on data inputs. Pohoiki Beach represents a major milestone in Intel's neuromorphic research, laying the foundation for Intel Labs to scale the architecture to 100 million neurons later this year," according to the official announcement.
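Sparse coding, the first algorithm named in Intel's announcement, seeks to explain a signal using as few dictionary elements as possible. A minimal NumPy sketch using iterative shrinkage-thresholding (ISTA), a classic conventional solver for the same objective, gives a feel for the problem; the dictionary size, sparsity penalty, and data here are illustrative assumptions, and this is not the algorithm as implemented on Loihi.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical problem: reconstruct a signal x as a sparse combination of
# dictionary atoms D, i.e. minimize ||x - D a||^2 + lam * ||a||_1.
n, k = 16, 32                       # signal length, dictionary size (illustrative)
D = rng.normal(size=(n, k))
D /= np.linalg.norm(D, axis=0)      # unit-norm atoms

a_true = np.zeros(k)
a_true[[3, 17, 25]] = [1.5, -2.0, 1.0]   # only 3 atoms are truly active
x = D @ a_true

def ista(x, D, lam=0.1, steps=200):
    """Iterative shrinkage-thresholding: a standard sparse-coding solver."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(steps):
        grad = D.T @ (D @ a - x)             # gradient of the reconstruction error
        a = a - grad / L                     # gradient step
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold
    return a

a = ista(x, D)
print("non-zero coefficients:", np.count_nonzero(np.abs(a) > 1e-3))
```

On neuromorphic hardware the same objective is typically attacked with spiking dynamics (e.g. locally competitive networks), where the sparsity of the solution maps directly onto sparse spiking activity and thus low power.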
Why neuromorphic technology is the key to future AI
On the commercial side, a neuromorphic chip made by IBM contains five times as many transistors as a standard Intel processor, Wired reports, yet consumes only 70 milliwatts of power. The chips demonstrated "neuromorphic" responses by processing sensory data, including images of objects and sounds, and were able to react to changes. Neuromorphic engineering requires developers to understand how the morphology of individual neurons, circuits, and applications affects how information is represented. The technology draws on the disciplines of biology, physics, mathematics, computer science, and electronic engineering.
The idea is to develop microprocessors configured more like human brains than traditional silicon chips, with the aim of making computers more astute about their environment; this is seen as a step forward for artificial intelligence. Neuroinformatics refers to the creation of neuromorphic chips that can replicate the brain's information-processing capabilities in real time. Key players in the development of neuromorphic computing are Qualcomm, IBM, HRL Laboratories, and the Human Brain Project. The Human Brain Project is a 10-year effort seeking to simulate a complete human brain in a supercomputer using biological data.
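The basic building block these chips implement is the spiking neuron, most simply the leaky integrate-and-fire (LIF) model: the neuron integrates input, leaks charge over time, and emits a discrete spike only when a threshold is crossed, which is what makes the computation event-driven and power-frugal. A minimal sketch follows; the time constant, threshold, and input values are illustrative assumptions, not parameters of any specific chip.

```python
def simulate_lif(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron over a list of input currents.

    Returns a 0/1 spike train the same length as the input.
    """
    v = 0.0
    spikes = []
    for i_t in input_current:
        v += dt * (-v / tau + i_t)   # leaky integration of the input current
        if v >= v_thresh:            # fire only when the threshold is crossed
            spikes.append(1)
            v = v_reset              # reset the membrane after the spike
        else:
            spikes.append(0)         # otherwise stay silent (no event, no energy)
    return spikes

current = [0.08] * 100               # constant drive above the firing threshold
spikes = simulate_lif(current)
print("spike count over 100 steps:", sum(spikes))
```

Because output is produced only at spike events, downstream neurons do work only when something happens, in contrast to clocked processors that burn power every cycle.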